In mathematical optimization, the Rosenbrock function is a non-convex function, introduced by Howard H. Rosenbrock in 1960[1], that is used as a performance test problem for optimization algorithms. It is also known as Rosenbrock's valley or Rosenbrock's banana function.
The global minimum lies inside a long, narrow, parabolic-shaped flat valley. Finding the valley is trivial; converging to the global minimum, however, is difficult.
It is defined by

    f(x, y) = (1 - x)^2 + 100 (y - x^2)^2

It has a global minimum at (x, y) = (1, 1), where f(x, y) = 0. A different coefficient of the second term is sometimes given, but this does not affect the position of the global minimum.
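A direct implementation makes the definition concrete. The sketch below (a minimal Python version, written with the coefficients 1 and 100 as parameters `a` and `b`) evaluates the function and illustrates the valley:

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Rosenbrock function f(x, y) = (a - x)^2 + b (y - x^2)^2."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# The global minimum lies at (a, a^2), where the function value is 0.
print(rosenbrock(1.0, 1.0))   # 0.0
# Points on the valley floor (y = x^2) avoid the b-weighted term entirely
# but still pay the (a - x)^2 penalty:
print(rosenbrock(0.5, 0.25))  # 0.25
```

Changing `b` rescales the valley walls but, as noted above, leaves the minimizer at (a, a^2).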
Two variants are commonly encountered. One is the sum of N/2 uncoupled 2D Rosenbrock problems,

    f(x) = f(x_1, x_2, ..., x_N) = \sum_{i=1}^{N/2} [ 100 (x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2 ]

This variant is defined only for even N and has predictably simple solutions.
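A short sketch of this uncoupled variant: each consecutive pair (x_{2i-1}, x_{2i}) forms an independent 2D problem, so the minimum is just the all-ones vector, pair by pair.

```python
def rosenbrock_uncoupled(x):
    """Sum of N/2 uncoupled 2D Rosenbrock problems (0-based indexing).
    Only defined when the dimension N = len(x) is even."""
    n = len(x)
    if n % 2 != 0:
        raise ValueError("this variant is only defined for even N")
    return sum(
        100.0 * (x[2 * i + 1] - x[2 * i] ** 2) ** 2 + (1.0 - x[2 * i]) ** 2
        for i in range(n // 2)
    )

print(rosenbrock_uncoupled([1.0] * 6))  # 0.0 -- minimum at all ones
```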
A more involved variant is

    f(x) = \sum_{i=1}^{N-1} [ 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]

This variant has been shown to have exactly one minimum for N = 3 (at (1, 1, 1)) and exactly two minima for 4 <= N <= 7 -- the global minimum at the point of all ones and a local minimum near (x_1, x_2, ..., x_N) = (-1, 1, ..., 1). This result is obtained by setting the gradient of the function equal to zero and noticing that the resulting equation is a rational function of x. For small N the polynomials can be determined exactly, and Sturm's theorem can be used to determine the number of real roots, while the roots can be bounded in the region of |x_i| < 2.4[4]. For larger N this method breaks down due to the size of the coefficients involved.
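The gradient condition can be checked numerically. The sketch below implements the coupled variant together with its analytic gradient (each summand contributes to two coordinates) and verifies that the all-ones point is stationary:

```python
def rosenbrock_nd(x):
    """Coupled N-dimensional Rosenbrock variant."""
    return sum(
        100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
        for i in range(len(x) - 1)
    )

def rosenbrock_grad(x):
    """Analytic gradient: differentiating summand i with respect to
    x_i and x_{i+1} and accumulating the contributions."""
    n = len(x)
    g = [0.0] * n
    for i in range(n - 1):
        g[i] += -400.0 * x[i] * (x[i + 1] - x[i] ** 2) - 2.0 * (1.0 - x[i])
        g[i + 1] += 200.0 * (x[i + 1] - x[i] ** 2)
    return g

ones = [1.0] * 7
print(rosenbrock_nd(ones))    # 0.0
print(rosenbrock_grad(ones))  # all-zero gradient: a stationary point
```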
Many of the stationary points of the function exhibit a regular pattern when plotted[4]. This structure can be exploited to locate them.
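The pattern-based approach of [4] is not reproduced here, but individual stationary points can be located by a generic Newton iteration on the gradient. A minimal 2D sketch, using the analytic gradient and Hessian of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2 and a 2x2 solve by Cramer's rule, starting from the classic test point (-1.2, 1):

```python
def newton_rosenbrock_2d(x, y, iters=50):
    """Newton's method on the gradient of the 2D Rosenbrock function.
    A generic stationary-point search, not the structure-exploiting
    method of [4]."""
    for _ in range(iters):
        # Gradient
        gx = -2.0 * (1.0 - x) - 400.0 * x * (y - x * x)
        gy = 200.0 * (y - x * x)
        # Hessian entries
        hxx = 2.0 + 1200.0 * x * x - 400.0 * y
        hxy = -400.0 * x
        hyy = 200.0
        # Solve H d = g by Cramer's rule, then take the step x <- x - d
        det = hxx * hyy - hxy * hxy
        dx = (hyy * gx - hxy * gy) / det
        dy = (hxx * gy - hxy * gx) / det
        x, y = x - dx, y - dy
    return x, y

print(newton_rosenbrock_2d(-1.2, 1.0))  # converges to about (1.0, 1.0)
```

Unlike plain gradient descent, which crawls along the valley floor, the Newton iterates here jump onto the parabola y = x^2 almost immediately and then home in on the minimum in a handful of steps.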
There are many ways to extend this function stochastically. In Xin-She Yang's functions, a generic or heuristic extension of Rosenbrock's function into a stochastic function is given[5]

    g(x) = \sum_{i=1}^{N-1} \epsilon_i [ 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]

where the random variables \epsilon_i obey a uniform distribution Unif(0, 1). In principle, this stochastic function has the same global optimum at (1, 1, ..., 1); however, its stochastic nature makes it impossible to use gradient-based optimization algorithms.
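A sketch of one natural reading of such an extension, assuming each summand of the coupled variant is scaled by an independent Unif(0, 1) draw \epsilon_i (the exact form in [5] may differ):

```python
import random

def stochastic_rosenbrock(x, rng=random):
    """Stochastic Rosenbrock: each summand of the coupled variant is
    scaled by an independent Unif(0, 1) weight. One possible form of
    the extension described in [5], not a verbatim reproduction."""
    return sum(
        rng.uniform(0.0, 1.0)
        * (100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2)
        for i in range(len(x) - 1)
    )

# Every summand vanishes at (1, ..., 1), so the optimum survives the noise:
print(stochastic_rosenbrock([1.0] * 5))  # 0.0
# Elsewhere, repeated evaluations at the same point disagree, which is
# why gradient-based methods cannot be applied directly:
print(stochastic_rosenbrock([0.0] * 5))
print(stochastic_rosenbrock([0.0] * 5))
```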